Bootstrap-adjusted quasi-likelihood information criteria for mixed model selection

Authors

Abstract

We propose two model selection criteria relying on the bootstrap approach, denoted by QAICb1 and QAICb2, in the framework of linear mixed models. Similar to the justification of the Akaike Information Criterion (AIC), the proposed QAICb1 and QAICb2 are proved to be asymptotically unbiased estimators of the Kullback–Leibler discrepancy between a candidate model and the true model. However, they are defined on the quasi-likelihood function instead of the likelihood function, and are proven to be asymptotically equivalent. The criteria are constructed with a bias estimation term for which the bootstrap method is adopted, improving on the bias caused by using estimated rather than true parameters. Simulations across a variety of settings are conducted to demonstrate that the proposed criteria outperform some other existing criteria in mixed model selection. Generalized estimating equations (GEE) are utilized to calculate the proposed criteria in the simulations. Their effectiveness is also demonstrated in an application to Parkinson's Progression Markers Initiative (PPMI) data.
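The bootstrap bias-correction idea underlying criteria such as QAICb1 and QAICb2 can be illustrated with a minimal sketch. This is not the paper's method: a plain Gaussian mean model and an ordinary likelihood stand in for the quasi-likelihood and mixed-model machinery, but the structure of the penalty — estimate the optimism of the maximized log-likelihood by refitting on bootstrap resamples — is the same general idea:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_mean_model(y):
    """Fit a trivial Gaussian model (mean and variance by MLE);
    return the parameter estimates and the maximized log-likelihood."""
    mu, sigma = y.mean(), y.std(ddof=0)
    n = len(y)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma**2) + 1)
    return (mu, sigma), loglik

def loglik_at(params, y):
    """Gaussian log-likelihood of sample y at fixed parameters."""
    mu, sigma = params
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (y - mu)**2 / (2 * sigma**2))

def bootstrap_ic(y, B=200):
    """AIC-style criterion where the usual 2k penalty is replaced by a
    bootstrap estimate of the bias (optimism) of the fitted log-likelihood."""
    _, loglik_full = fit_mean_model(y)
    bias = 0.0
    for _ in range(B):
        yb = rng.choice(y, size=len(y), replace=True)
        params_b, loglik_b = fit_mean_model(yb)
        # optimism: fit on the resample, then evaluate on the original sample
        bias += loglik_b - loglik_at(params_b, y)
    bias /= B
    return -2 * loglik_full + 2 * bias
```

For this two-parameter Gaussian model the bootstrap penalty concentrates near k = 2, recovering the AIC penalty; bootstrap-corrected criteria are useful precisely where such analytic penalties are unreliable, e.g. in small samples or under quasi-likelihood estimation.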


Related articles

Nonparametric Bootstrap for Quasi-Likelihood Ratio Tests∗

We introduce a nonparametric bootstrap approach for Quasi-Likelihood Ratio type tests of nonlinear restrictions. Our method applies to extremum estimators, such as quasimaximum likelihood and generalized method of moments estimators. Unlike existing parametric bootstrap procedures for Quasi-Likelihood Ratio type tests, our procedure constructs bootstrap samples in a fully nonparametric way. We ...


Bootstrap Estimate of Kullback-Leibler Information for Model Selection

Estimation of Kullback-Leibler amount of information is a crucial part of deriving a statistical model selection procedure which is based on likelihood principle like AIC. To discriminate nested models, we have to estimate it up to the order of constant while the Kullback-Leibler information itself is of the order of the number of observations. A correction term employed in AIC is an example to...


Model comparison with composite likelihood information criteria

Comparisons are made for the amount of agreement of the composite likelihood information criteria and their full likelihood counterparts when making decisions among the fits of different models, and some properties of penalty term for composite likelihood information criteria are obtained. Asymptotic theory is given for the case when a simpler model is nested within a bigger model, and the bigg...


Bootstrap variants of the Akaike information criterion for mixed model selection

Two bootstrap-corrected variants of the Akaike information criterion are proposed for the purpose of small-sample mixed model selection. These two variants are asymptotically equivalent, and provide asymptotically unbiased estimators of the expected Kullback-Leibler discrepancy between the true model and a fitted candidate model. The performance of the criteria is investigated in a simulation s...


Information criteria for astrophysical model selection

Model selection is the problem of distinguishing competing models, perhaps featuring different numbers of parameters. The statistics literature contains two distinct sets of tools, those based on information theory such as the Akaike Information Criterion (AIC), and those on Bayesian inference such as the Bayesian evidence and Bayesian Information Criterion (BIC). The Deviance Information Crite...
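The two criteria named above differ only in how the penalty scales with the data; a minimal sketch of both formulas, where `loglik` is the maximized log-likelihood, `k` the number of free parameters, and `n` the number of observations:

```python
import math

def aic(loglik, k):
    """Akaike Information Criterion: -2 log L + 2k."""
    return -2 * loglik + 2 * k

def bic(loglik, k, n):
    """Bayesian Information Criterion: -2 log L + k log n."""
    return -2 * loglik + k * math.log(n)

# For n > e^2 (about 7.4 observations), log n > 2, so BIC penalizes
# extra parameters more heavily than AIC and favors smaller models.
```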



Journal

Journal title: Journal of Applied Statistics

Year: 2022

ISSN: 1360-0532, 0266-4763

DOI: https://doi.org/10.1080/02664763.2022.2143484